Multi-Step Stochastic ADMM in High Dimensions: Applications to Sparse Optimization and Noisy Matrix Decomposition

Authors

  • Hanie Sedghi
  • Anima Anandkumar
  • Edmond Jonckheere
Abstract

We propose an efficient ADMM method with guarantees for high-dimensional problems. We provide explicit bounds for the sparse optimization problem and the noisy matrix decomposition problem. For sparse optimization, we establish that the modified ADMM method has an optimal regret bound of O(s log d / T), where s is the sparsity level, d is the data dimension and T is the number of steps. This matches the minimax lower bound for sparse estimation. For matrix decomposition into sparse and low-rank components, we provide the first guarantees for any online method, and prove a regret bound of Õ((s + r)β(p)/T) + O(1/p) for a p × p matrix, where s is the sparsity level, r is the rank and Θ(√p) ≤ β(p) ≤ Θ(p). Our guarantees match the minimax lower bound with respect to s, r and T. In addition, we match the minimax lower bound with respect to the matrix dimension p, i.e. β(p) = Θ(√p), for many important statistical models, including the independent noise model, the linear Bayesian network and the latent Gaussian graphical model under some conditions. Our ADMM method is based on epoch-based annealing and consists of inexpensive steps which involve projections onto simple norm balls.
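The epoch-based projection step described in the abstract can be illustrated with a small sketch. The code below is not the paper's algorithm: it is a minimal, assumed instantiation for a squared loss with a consensus split theta = z, where one epoch runs linearized stochastic ADMM updates and projects the primal iterate onto an l1 ball (one of the "simple norm balls"). All function names, step sizes and the loss choice are illustrative assumptions.

import numpy as np

def project_l1_ball(v, radius):
    # Euclidean projection of v onto the l1 ball of the given radius.
    if np.abs(v).sum() <= radius:
        return v.copy()
    u = np.sort(np.abs(v))[::-1]
    css = np.cumsum(u)
    idx = np.nonzero(u * np.arange(1, v.size + 1) > css - radius)[0][-1]
    tau = (css[idx] - radius) / (idx + 1.0)
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def stochastic_admm_epoch(X, y, theta, z, u, radius, rho=1.0, step=0.01, lam=0.1):
    # One illustrative epoch for  min E[(x_i' theta - y_i)^2] + lam * ||z||_1
    # subject to theta - z = 0, with an extra projection of theta onto an
    # l1 ball; hypothetical names and constants, not the paper's exact updates.
    for i in np.random.permutation(X.shape[0]):
        grad = 2.0 * (X[i] @ theta - y[i]) * X[i]               # stochastic gradient
        theta = theta - step * (grad + rho * (theta - z + u))   # linearized primal update
        theta = project_l1_ball(theta, radius)                  # norm-ball projection
        w = theta + u
        z = np.sign(w) * np.maximum(np.abs(w) - lam / rho, 0.0) # soft-thresholding (prox of l1)
        u = u + theta - z                                       # scaled dual update
    return theta, z, u

Across epochs, the projection radius (and typically the step size) would be shrunk, which is the epoch-based annealing the abstract refers to; only a single epoch is sketched here.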

Similar articles

Multi-Step Stochastic ADMM in High Dimensions: Applications to Sparse Optimization and Matrix Decomposition

In this paper, we consider a multi-step version of the stochastic ADMM method with efficient guarantees for high-dimensional problems. We first analyze the simple setting, where the optimization problem consists of a loss function and a single regularizer (e.g. sparse optimization), and then extend to the multi-block setting with multiple regularizers and multiple variables (e.g. matrix decompo...

Full text
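For concreteness, the multi-block setting mentioned in the entry above can be written, in generic notation of my own rather than the authors', as a loss plus two regularizers coupled by a linear constraint:

\min_{M,\,S,\,L}\ \mathcal{L}(M) \;+\; \lambda \|S\|_1 \;+\; \mu \|L\|_{*}
\quad \text{subject to} \quad M = S + L,

where \|S\|_1 is the entrywise \ell_1 norm promoting sparsity, \|L\|_{*} is the nuclear norm promoting low rank, and ADMM alternates over the blocks S and L together with the dual variable for the constraint.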


Noisy matrix decomposition via convex relaxation: Optimal rates in high dimensions

We analyze a class of estimators based on convex relaxation for solving high-dimensional matrix decomposition problems. The observations are noisy realizations of a linear transformation X of the sum of an (approximately) low rank matrix Θ⋆ with a second matrix Γ⋆ endowed with a complementary form of low-dimensional structure; this set-up includes many statistical models of interest, including ...

Full text
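A generic form of the convex-relaxation estimator analyzed in the entry above is the following, where the symbols y, \mathfrak{X}, \lambda_n and \mu_n are placeholders I am assuming rather than the authors' exact notation:

(\widehat{\Theta}, \widehat{\Gamma}) \in \arg\min_{\Theta,\,\Gamma}\ \frac{1}{2n}\,\big\| y - \mathfrak{X}(\Theta + \Gamma) \big\|_2^2 \;+\; \lambda_n \|\Theta\|_{*} \;+\; \mu_n\, \mathcal{R}(\Gamma),

with \|\Theta\|_{*} the nuclear norm for the (approximately) low-rank component and \mathcal{R} a regularizer matched to the complementary low-dimensional structure of \Gamma (e.g. an entrywise \ell_1 norm for sparsity).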


Scalable Stochastic Alternating Direction Method of Multipliers

Alternating direction method of multipliers (ADMM) has been widely used in many applications due to its promising performance to solve complex regularization problems and large-scale distributed optimization problems. Stochastic ADMM, which visits only one sample or a mini-batch of samples each time, has recently been proved to achieve better performance than batch ADMM. However, most stochasti...

Full text
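As background for the entry above, the standard (batch) ADMM updates for \min_{x,z} f(x) + g(z) subject to Ax + Bz = c, written with a scaled dual variable u and penalty parameter \rho, are

x^{k+1} = \arg\min_x\ f(x) + \tfrac{\rho}{2}\,\| A x + B z^{k} - c + u^{k} \|_2^2,
z^{k+1} = \arg\min_z\ g(z) + \tfrac{\rho}{2}\,\| A x^{k+1} + B z - c + u^{k} \|_2^2,
u^{k+1} = u^{k} + A x^{k+1} + B z^{k+1} - c.

Stochastic variants such as the one summarized in this entry replace f in the x-update with a first-order approximation built from a single sample or a mini-batch, which is what makes each iteration cheap.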


Journal title:

Volume   Issue

Pages  -

Publication date: 2014